# Multilingual understanding

## Gemma 3 4b It Uncensored DBL X Int2 Quantized

**Author:** Kfjjdjdjdhdhd · **Downloads:** 39 · **Likes:** 1 · **Tags:** Large Language Model, Transformers

A pre-trained model built on the Transformers library, suitable for general natural language processing tasks.

## Flan T5 Xl Gguf

**Author:** deepfile · **License:** Apache-2.0 · **Downloads:** 61 · **Likes:** 8 · **Tags:** Large Language Model, Supports Multiple Languages

FLAN-T5 is the instruction-fine-tuned version of the T5 model. Fine-tuned on more than 1,000 multilingual tasks, it delivers superior performance at the same parameter count. This entry packages the model in GGUF format.

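The GGUF file itself targets llama.cpp-style runtimes. As a minimal sketch of the underlying model's instruction following, the upstream google/flan-t5-xl checkpoint (an assumption here, not this entry's quantized file) can be run with `transformers`:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Upstream checkpoint assumed for illustration; the GGUF file in this
# entry is meant for llama.cpp-compatible runtimes instead.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xl")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-xl")

prompt = "Translate English to German: The library opens at nine."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
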
## Infoxlm Large

**Author:** microsoft · **Downloads:** 1.1M · **Likes:** 12 · **Tags:** Large Language Model, Transformers

InfoXLM is a cross-lingual pre-training framework based on information theory, designed to enhance cross-lingual representation learning by maximizing mutual information between different languages.

## Mdeberta V3 Base

**Author:** microsoft · **License:** MIT · **Downloads:** 692.08k · **Likes:** 179 · **Tags:** Large Language Model, Transformers, Supports Multiple Languages

mDeBERTa is the multilingual version of DeBERTa. It uses ELECTRA-style pre-training with gradient-disentangled embedding sharing and performs strongly on cross-lingual tasks such as XNLI.

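A minimal sketch, assuming microsoft/mdeberta-v3-base is the Hugging Face Hub ID for this entry: encode the same sentence in two languages and compare the resulting first-token representations as a rough cross-lingual check.

```python
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "microsoft/mdeberta-v3-base"  # assumed Hub ID for this entry

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)  # needs `sentencepiece` installed
model = AutoModel.from_pretrained(MODEL_ID)

sentences = ["The weather is nice today.", "El clima es agradable hoy."]
inputs = tokenizer(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, seq_len, hidden)

# Cosine similarity of the first-token vectors; a crude probe, not a benchmark.
first = hidden[:, 0]
sim = torch.nn.functional.cosine_similarity(first[0], first[1], dim=0)
print(f"cross-lingual similarity: {sim.item():.3f}")
```
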
## Distilbert Base Uncased Mnli

**Author:** typeform · **Downloads:** 74.81k · **Likes:** 38 · **Tags:** Large Language Model, Transformers, English

DistilBERT is a distilled version of BERT that retains 97% of BERT's performance while being 40% smaller and 60% faster; this checkpoint is fine-tuned on the MultiNLI dataset for natural language inference.

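Because this is an NLI checkpoint, it can drive zero-shot classification. A minimal sketch, assuming typeform/distilbert-base-uncased-mnli is the Hub ID:

```python
from transformers import pipeline

# Assumed Hub ID for this entry.
classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)

result = classifier(
    "The new phone has an excellent camera but poor battery life.",
    candidate_labels=["electronics", "sports", "politics"],
)
print(result["labels"][0], round(result["scores"][0], 3))  # highest-scoring label
```
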
## Xlm Align Base

**Author:** microsoft · **Downloads:** 354 · **Likes:** 9 · **Tags:** Large Language Model, Transformers

XLM-Align is a pre-trained cross-lingual model supporting 94 languages; it improves pre-trained cross-lingual language models via self-labeled word alignment.

## Multilingual MiniLM L12 H384

**Author:** microsoft · **License:** MIT · **Downloads:** 28.51k · **Likes:** 83 · **Tags:** Large Language Model, Supports Multiple Languages

MiniLM is a compact, efficient pre-trained language model, compressed from larger Transformer models through deep self-attention distillation, and supports multilingual understanding and generation tasks.

## Infoxlm Base

**Author:** microsoft · **Downloads:** 20.30k · **Likes:** 7 · **Tags:** Large Language Model, Transformers

InfoXLM is a cross-lingual pre-training framework based on information theory, designed to improve performance on cross-lingual tasks by maximizing mutual information.

## Electricidad Small Finetuned Xnli Es

**Author:** mrm8488 · **License:** MIT · **Downloads:** 18 · **Likes:** 2 · **Tags:** Large Language Model, Transformers, Supports Multiple Languages

A Spanish pre-trained model fine-tuned for cross-lingual natural language inference (XNLI) in Spanish.

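A minimal sketch of premise–hypothesis scoring, assuming mrm8488/electricidad-small-finetuned-xnli-es is the Hub ID and that the checkpoint's config carries the NLI label names:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "mrm8488/electricidad-small-finetuned-xnli-es"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)

premise = "El perro corre por el parque."
hypothesis = "Un animal se está moviendo."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]

# Label names come from the checkpoint's config; the order is not assumed here.
for idx, p in enumerate(probs.tolist()):
    print(f"{model.config.id2label[idx]}: {p:.3f}")
```
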
## Crosloengual Bert

**Author:** EMBEDDIA · **Downloads:** 510 · **Likes:** 4 · **Tags:** Large Language Model, Supports Multiple Languages

A trilingual model based on the bert-base architecture, focused on Croatian, Slovenian, and English, where it outperforms multilingual BERT.

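A minimal sketch, assuming EMBEDDIA/crosloengual-bert is the Hub ID and the standard BERT `[MASK]` token: fill-mask in each of the model's three languages.

```python
from transformers import pipeline

# Assumed Hub ID for this entry.
fill = pipeline("fill-mask", model="EMBEDDIA/crosloengual-bert")

for text in [
    "London is the capital of [MASK].",   # English
    "Zagreb je glavni grad [MASK].",      # Croatian
    "Ljubljana je glavno mesto [MASK].",  # Slovenian
]:
    best = fill(text)[0]  # top prediction for the masked token
    print(best["token_str"], f"({best['score']:.3f})")
```
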